
    Brain-Computer Interface meets ROS: A robotic approach to mentally drive telepresence robots

    This paper presents and evaluates a novel approach to integrating a non-invasive Brain-Computer Interface (BCI) with the Robot Operating System (ROS) to mentally drive a telepresence robot. Controlling a mobile device through human brain signals may improve the quality of life of people with severe physical disabilities or elderly people who can no longer move. The BCI user can thus actively interact with relatives and friends located in different rooms thanks to a video streaming connection to the robot. To facilitate control of the robot via BCI, we explore new ROS-based algorithms for navigation and obstacle avoidance, making the system safer and more reliable. In this regard, the robot exploits two maps of the environment, one for localization and one for navigation, and both can also be used by the BCI user to monitor the position of the robot while it is moving. As demonstrated by the experimental results, the user's cognitive workload is reduced, decreasing the number of commands necessary to complete the task and helping him/her to keep attention for longer periods of time. Comment: Accepted in the Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA).
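    As a rough illustration of how a BCI output stream might be bridged into ROS, the sketch below maps discrete mental commands to velocity messages. The topic names (`/bci/command`, `/cmd_vel`), the command vocabulary, and the velocity values are assumptions made for illustration only, not the interface described in the paper.

```python
#!/usr/bin/env python
# Hypothetical sketch: bridge discrete BCI commands to robot velocities in ROS 1.
# Topic names, command set, and velocities are illustrative assumptions.
import rospy
from std_msgs.msg import String
from geometry_msgs.msg import Twist

class BCIBridge:
    def __init__(self):
        # Discrete mental commands mapped to (linear, angular) velocities.
        self.mapping = {
            "forward": (0.2, 0.0),
            "left":    (0.0, 0.5),
            "right":   (0.0, -0.5),
            "stop":    (0.0, 0.0),
        }
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/bci/command", String, self.on_command)

    def on_command(self, msg):
        # Unknown commands default to a safe stop.
        lin, ang = self.mapping.get(msg.data, (0.0, 0.0))
        twist = Twist()
        twist.linear.x = lin
        twist.angular.z = ang
        self.cmd_pub.publish(twist)

if __name__ == "__main__":
    rospy.init_node("bci_bridge")
    BCIBridge()
    rospy.spin()
```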

    Socially Assistive Robots for Inclusion

    Building socially assistive robots that are respected by and represent people is a very challenging task, requiring cross-disciplinary research. On the one hand, it is essential to advance the Socially Assistive Robotics field so that any user can exploit these devices. On the other hand, to be successful this technology has to satisfy people not only from a technical (performance) point of view, but especially in terms of human-robot interaction. In this paper we present some of the challenges we have been facing to make socially assistive robots a platform that guarantees new inclusion opportunities.

    Towards a Brain-Robot Interface for children

    Brain-Computer Interface (BCI) systems have been widely studied and explored with adults, demonstrating the possibility of achieving augmentative communication and control directly from the user's brain. Nevertheless, the study and exploitation of BCIs in children remains limited. In this paper we propose and present, for the first time, a Brain-Robot Interface enabling children to mentally drive a robot. To this end, we exploit the combination of a P300-based Brain-Computer Interface and a shared-autonomy approach to achieve reliable and safe robot navigation. We tested our system in a pilot study involving five children. Our preliminary results highlight the advantages of using an accumulation framework, thanks to which the children's performance reached an average accuracy of 81.67% ± 12.7. During the experiments, the shared-autonomy approach involved a low-level intelligent controller on board the robot to avoid obstacles, enabling effective navigation even with a small number of commands.
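    The accumulation framework mentioned above can be thought of as integrating single-trial P300 classifier outputs over repeated stimulations until the evidence for one target crosses a decision threshold. A minimal sketch, assuming an exponential-smoothing rule and an arbitrary threshold (neither taken from the paper):

```python
import numpy as np

def accumulate_decision(prob_stream, threshold=0.9, alpha=0.8):
    """Accumulate per-round P300 class probabilities until one target
    exceeds `threshold`. `prob_stream` yields one probability vector per
    stimulation round; `alpha` and `threshold` are illustrative values."""
    evidence = None
    for probs in prob_stream:
        probs = np.asarray(probs, dtype=float)
        # Exponentially smooth the new evidence into the running estimate.
        evidence = probs if evidence is None else alpha * evidence + (1 - alpha) * probs
        evidence = evidence / evidence.sum()   # keep it a distribution
        if evidence.max() >= threshold:
            return int(evidence.argmax()), evidence
    # No target reached the threshold within the available rounds.
    return None, evidence

if __name__ == "__main__":
    rounds = [[0.4, 0.3, 0.3], [0.7, 0.2, 0.1], [0.9, 0.05, 0.05]]
    print(accumulate_decision(rounds, threshold=0.6))
```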

    Hybrid Brain-Robot Interface for telepresence

    Brain-Computer Interface (BCI) technology allows brain signals to be used as an alternative channel to control external devices. In this work, we introduce a Hybrid Brain-Robot Interface to mentally drive mobile robots. The proposed system sets the direction of motion of the robot by combining two brain stimulation paradigms: motor imagery and visual event-related potentials. The first enables the user to send turn-left or turn-right commands, rotating the robot by a certain angle, while the second enables the user to easily select high-level goals for the robot in the environment. Finally, the system is integrated with a shared-autonomy approach in order to improve the interaction between the user and the intelligent robot, achieving reliable and robust navigation.
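    One way to picture the combination of the two paradigms is a small dispatcher that turns motor-imagery detections into relative rotation commands and ERP-based selections into high-level navigation goals. The event encoding, rotation step, and goal coordinates below are purely illustrative assumptions, not the authors' implementation.

```python
import math

# Hypothetical goal positions a P300/ERP selection could map to (map frame, metres).
GOALS = {"kitchen": (3.0, 1.5), "living_room": (-2.0, 4.0), "entrance": (0.0, 0.0)}
ROTATION_STEP = math.radians(30)   # illustrative turn angle per motor-imagery command

def dispatch(event):
    """Turn a decoded BCI event into an abstract robot command.
    event = ("mi", "left"/"right") or ("erp", goal_name)."""
    kind, value = event
    if kind == "mi":
        sign = 1.0 if value == "left" else -1.0
        return {"type": "rotate", "angle": sign * ROTATION_STEP}
    if kind == "erp" and value in GOALS:
        x, y = GOALS[value]
        return {"type": "navigate", "x": x, "y": y}
    return {"type": "noop"}

if __name__ == "__main__":
    for ev in [("mi", "left"), ("erp", "kitchen"), ("mi", "right")]:
        print(dispatch(ev))
```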

    Shared approaches to mentally drive telepresence robots

    Recently there has been a growing interest in designing human-in-the-loop applications based on shared approaches that fuse the user's commands with the perception of the context. In this scenario, we focus on user-supervised telepresence robots, designed to improve the quality of life of people with severe physical disabilities or elderly people who can no longer move. In this regard, we introduce brain-machine interfaces that enable users to directly control the robot through their brain activity. Given the nature of this interface, characterized by a low bit rate and noise, we present different methodologies to augment the human-robot interaction and to facilitate the research and development of these technologies.
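    A common way to realize such a shared approach is to blend the user's low-rate, noisy command with a correction computed from the robot's perception of the context, for example a repulsion term derived from nearby obstacles. The blending weight and the potential-field-style correction in this sketch are illustrative assumptions rather than the specific method used in these works.

```python
import numpy as np

def shared_control(user_cmd, obstacles, robot_pos, blend=0.6, influence=1.0):
    """Blend a 2D user velocity command with an obstacle-repulsion term.
    `user_cmd`, `robot_pos`: 2D arrays; `obstacles`: list of 2D points.
    `blend` and `influence` are illustrative parameters."""
    repulsion = np.zeros(2)
    for obs in obstacles:
        diff = robot_pos - np.asarray(obs, dtype=float)
        dist = np.linalg.norm(diff)
        if 1e-6 < dist < influence:
            # Push away from close obstacles, more strongly when nearer.
            repulsion += (1.0 / dist - 1.0 / influence) * diff / dist
    # Weighted fusion of the user's intent and the autonomous correction.
    return blend * np.asarray(user_cmd, dtype=float) + (1.0 - blend) * repulsion

if __name__ == "__main__":
    fused = shared_control(np.array([0.3, 0.0]),
                           obstacles=[(0.5, 0.1)],
                           robot_pos=np.array([0.0, 0.0]))
    print(fused)
```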